Generalization Performance of Regularization Networks and Support Vector Machines via Entropy Numbers of Compact Operators (produced as part of the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2 27150)
Authors: Robert C. Williamson, Alex J. Smola, Bernhard Schölkopf
Abstract
We derive new bounds for the generalization error of kernel machines, such as support vector machines and related regularization networks, by obtaining new bounds on their covering numbers. The proofs make use of a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinite-dimensional unit ball in feature space into a finite-dimensional space. The covering numbers of the class are then determined via the entropy numbers of the operator. These numbers, which characterize the degree of compactness of the operator, can be bounded in terms of the eigenvalues of an integral operator induced by the kernel function used by the machine. As a consequence we are able to theoretically explain the effect of the choice of kernel function on the generalization performance of support vector machines.
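To make the abstract's chain of reasoning concrete, the following sketch (Python with NumPy; the Gaussian kernel, its width, and the uniform sample are illustrative assumptions, not taken from the paper) approximates the eigenvalues of the kernel-induced integral operator by the scaled eigenvalues of an empirical Gram matrix. The faster these eigenvalues decay, the more compact the operator, the smaller its entropy numbers, and the tighter the resulting covering-number bounds.

    import numpy as np

    def rbf_kernel(X, Y, gamma=1.0):
        # Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
        sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
        return np.exp(-gamma * sq)

    # Sample from an (assumed) input distribution: uniform on [0, 1]^2.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(500, 2))

    # Eigenvalues of the Gram matrix, divided by the sample size, approximate
    # the eigenvalues of the integral operator (T_k f)(x) = E_y[k(x, y) f(y)]
    # induced by the kernel; their decay rate governs the entropy numbers.
    K = rbf_kernel(X, X, gamma=5.0)
    eigvals = np.linalg.eigvalsh(K)[::-1] / len(X)
    print(eigvals[:10])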
Similar resources
Multiplicative Updatings for Support-Vector Learning (produced as part of the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2 27150)
Support Vector machines find maximal margin hyperplanes in a high-dimensional feature space. Theoretical results exist which guarantee high generalization performance when the margin is large or when the number of support vectors is small. Multiplicative-updating algorithms are a new tool for perceptron learning whose theoretical properties are well studied. In this work we present a Multiplica...
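The multiplicative-updating algorithm of that paper is only summarized above and is not reproduced here; as a hedged illustration of what a multiplicative (rather than additive) update means in perceptron learning, the sketch below implements the classic Winnow rule instead. The learning rate, target concept, and data are hypothetical choices for the example.

    import numpy as np

    def winnow(X, y, eta=2.0, epochs=10):
        # Winnow: a perceptron variant whose weights are updated
        # multiplicatively instead of additively.
        m, n = X.shape
        w, theta = np.ones(n), n / 2.0  # standard Winnow threshold
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                pred = 1 if w @ xi >= theta else 0
                if pred != yi:
                    # Promote active weights on a false negative,
                    # demote them on a false positive.
                    w *= eta ** ((yi - pred) * xi)
        return w

    # Hypothetical target concept: x0 OR x2 over five boolean attributes.
    rng = np.random.default_rng(1)
    X = rng.integers(0, 2, size=(200, 5))
    y = X[:, 0] | X[:, 2]
    print(winnow(X, y))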
Dynamically Adapting Kernels in Support Vector Machines (produced as part of the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2 27150)
The kernel parameter is one of the few tunable parameters in Support Vector machines, and it controls the complexity of the resulting hypothesis. The choice of its value amounts to model selection, and is usually performed by means of a validation set. We present an algorithm which can automatically perform model selection and learning with no additional computational cost and with no need of a validation set.
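That paper's procedure adapts the kernel during training and is not reproduced here. As a rough sketch of validation-free model selection in the same spirit (the dataset, the grid of gamma values, and the large C are assumptions of the example), one can instead pick the kernel width minimizing the classical leave-one-out bound #SV/m, which needs no held-out data.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=10, random_state=0)

    best = None
    for gamma in [1e-3, 1e-2, 1e-1, 1.0, 10.0]:
        clf = SVC(kernel="rbf", gamma=gamma, C=1e3).fit(X, y)
        # The leave-one-out error of an SVM is at most #SV / m, so this
        # quantity can serve as an in-sample model-selection criterion.
        bound = clf.n_support_.sum() / len(X)
        if best is None or bound < best[1]:
            best = (gamma, bound)
    print("selected gamma:", best[0], "LOO bound:", best[1])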
Generalization Bounds via Eigenvalues of the Gram Matrix (produced as part of the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2 27150)
Model selection in Support Vector machines is usually carried out by minimizing the quotient of the radius of the smallest enclosing sphere of the data and the observed margin on the training set. We provide a new criterion taking the distribution within that sphere into account by considering the Gram matrix of the data. In particular, this makes use of the eigenvalue distribution of the matrix.
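A minimal sketch of the quantities involved, under stated assumptions (an RBF kernel, a toy dataset, a nearly hard-margin SVM obtained with large C, and the distance to the kernel mean as a cheap stand-in for the smallest enclosing sphere): the classical criterion is the radius-margin quotient R^2 * ||w||^2, and the refinement described above additionally inspects the eigenvalue spectrum of the Gram matrix.

    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.svm import SVC

    X, y = make_blobs(n_samples=200, centers=2, random_state=0)
    gamma = 0.5
    K = rbf_kernel(X, gamma=gamma)

    # Radius proxy: max feature-space distance from a point to the kernel mean.
    R2 = np.max(np.diag(K) - 2 * K.mean(axis=1) + K.mean())

    # Margin: for a (nearly) hard-margin SVM, margin = 1 / ||w||, and
    # ||w||^2 is recoverable from the dual coefficients alpha_i * y_i.
    clf = SVC(kernel="rbf", gamma=gamma, C=1e6).fit(X, y)
    sv = clf.support_
    w2 = (clf.dual_coef_ @ K[np.ix_(sv, sv)] @ clf.dual_coef_.T).item()

    print("radius-margin quotient:", R2 * w2)
    # The refined criterion also looks at the Gram matrix spectrum.
    print("leading eigenvalues:", np.linalg.eigvalsh(K)[::-1][:5])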
New Support Vector Algorithms (produced as part of the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2 27150)
We describe a new class of Support Vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of Support Vectors. While this can be useful in its own right, the parametrization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case.
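This ν-parametrization is what scikit-learn exposes as NuSVC and NuSVR. A short usage sketch (the noisy sinc dataset and kernel settings are illustrative) showing that ν acts as a lower bound on the fraction of support vectors in regression:

    import numpy as np
    from sklearn.svm import NuSVR

    # Noisy 1-D sinc regression problem (illustrative data).
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(200)

    # nu lower-bounds the fraction of support vectors (and upper-bounds the
    # fraction of margin errors), replacing the epsilon accuracy parameter.
    for nu in [0.1, 0.5, 0.9]:
        model = NuSVR(nu=nu, C=1.0, kernel="rbf").fit(X, y)
        print(f"nu={nu}: fraction of SVs = {model.support_.size / len(X):.2f}")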
Publication date: 1998